Saturday, November 19, 2016

I have a confession. Box was the first app I ever used, on day one of my faculty professional development in tablet-based instructional approaches. That was in 2014, and the task was to record a video using the Box app. Many of my colleagues use it regularly in their classes, which is why I'm almost embarrassed to admit that it took me this long to discover a use for it: in my graduate course in molecular biology.
In its simplest form, Box (available as an app and via web interface) is a great place to collect and share files. It has a robust system for setting access permissions when inviting others to access files you place in a Box folder you create. This I already knew.
But, then I had an issue to address in class: my travel conflicted with one of our class meetings. Once again, I started exploring how technology could enhance my ability to conduct course meetings remotely. The format of my molecular biology course is that students read a published research manuscript in advance of class, and then we meet to discuss, interpret, and critique the study. How could I provide students with a version of this format during a week when we don't meet in person?
There are several possible approaches, including using Zoom (or other videoconferencing platforms), but I settled on the idea of having each student provide a digital presentation of one figure or table from the assigned manuscript. Instead of discussing in person, each student would produce a video in which s/he presents and interprets a figure. The grading in my class is based entirely on participation, so it would be relatively easy for me to assess participation in a digital discussion.
Thus, I assigned each student one figure to present. They are allowed either to record a movie or (as a backup approach, in case of technical hurdles) to produce a written description and interpretation of that figure. I discovered, while re-exploring Box, that the Box app can record a movie directly from the tablet and upload it to the class Box folder - and the workflow is easy. More importantly, for me, Box also incorporates a commenting feature and a way to post text.
Beyond posting one figure presentation (movie or text) to our Box folder, each student must view (and leave a critical comment or question on) one peer's presentation of each figure in the manuscript. This ensures that each student has engaged in the analysis of the data presented in the manuscript, as we do weekly in person in class. Finally, I also required that each student respond to at least one comment/question left by a peer on their own presentation.
Because students will use the Box Comment tool to leave (and respond to) comments on both text and movie posts, this also makes it easy for me to assess student participation.
This plan kicks off in three days, and I'll post an update after I learn whether my best-laid plans work the way I hope!
My only concern is that, because the use of Box was a rather last-minute decision on my part, I didn't have time built into my course for students to practice using Box beforehand. This underscores an important best practice for any incorporation of a new technology into a class: build (and protect) time in your class to practice workflows before you employ them in any graded assessment activities.
Sunday, November 13, 2016
Google Sheets: Practicing the Principles of Deep Seq Alignment & Assembly
One of the more difficult concepts for me to help students initially grasp is how deep sequencing (next-generation sequencing) works, particularly in how millions of short (~35 nucleotide) DNA sequences can be used to identify the sequence of an entire chromosome (~millions of nucleotides). My perspective on why this topic is conceptually tricky is that deep seq involves huge amounts of data that are normally analyzed by "black box" computational algorithms. Thus, it has been intractable for me to develop in-class paper exercises that scaffold the process of deep seq assembly and engage each student in a group effort to solve an explicit problem.
Until DISCOVERe.
I'm currently teaching a graduate course in molecular biology as one of Fresno State's DISCOVERe (tablet computer) courses, in which each student is required to bring a tablet computer to class each week. As I've written previously, one way I'm leveraging this opportunity to improve student learning is by creating exercises that require students to use computers to understand concepts and solve problems.
I've written this post with appropriate biology background for the lay audience. Moreover, I've emphasized many aspects of the use of a shared spreadsheet that I think can be leveraged in disciplines other than biology. –JR
Last week, for the topic of genome sequencing, we read a primary literature article: a manuscript published in Nature describing the genome assembly of a 45,000-year-old modern human using a deep sequencing approach.
For you non-geneticists, here's the crash course on genome assembly:
Background
DNA comprises four chemical building blocks, called nucleotides, abbreviated A, C, G, and T, that are linked together in long chains we call chromosomes. The 23 human chromosomes are millions of nucleotides long, and the specific order of nucleotides encodes the instructions for building and operating cells and organisms, much like the order of letters in this blog post gives you instructions for building and operating the exercise I used in my class! By analogy, each of the 23 chromosomes contains different genes, most of which have unique roles in building and operating a living organism, and each chromosome can be considered one chapter in the book of life. "DNA sequencing" is the process of obtaining the identity and order of the nucleotides that comprise a chromosome.
The Problems
- We do not have the technology to take an entire chromosome and read the sequence of nucleotides from one end to the other. Rather, we might be able to read, at most, a few thousand nucleotides in a row at a time.
- Technologies we use to determine the sequence of a nucleotide chain (a chromosome) are prone to occasionally mis-reading a nucleotide.
Current Solutions
- We use technologies, like deep sequencing, that don't rely on obtaining long stretches of nucleotide sequence information.
- Modern approaches like deep sequencing take advantage of obtaining multiple experimental sequence reads of the same spot on the same chromosome to identify errors. For example, if one nucleotide on a chromosome is experimentally "read" twenty times, with nineteen of the sequence reads containing an "A" at that position, and one read containing a "T," then we conclude that the rare nucleotide T is an experimental error that doesn't reflect the true nucleotide found at that position on a chromosome (the A). This most-frequent value is called the consensus.
A New Problem
Now, these solutions raise a new problem of their own. Continuing the analogy of a chromosome representing a chapter of information about how to build and operate a human, short DNA sequences obtained using deep seq technology might only comprise a few "words" from a chromosome. The major challenge a decade ago was to develop computer algorithms that would take the millions of short nucleotide sequences and look for shared overlaps that could be used to assemble the short sequences into a longer contiguous sequence of an entire chromosome.

Example of Deep Sequence Assembly
For example, let's say we were sequencing the chromosome containing the paragraph I just wrote. I'll focus on the clause "look for shared overlaps that could be used to assemble the short sequences." From deep sequencing, we might have obtained a series of sequences as follows (in no particular order, as a geneticist would obtain from a deep seq experiment):
beusedtoass, rlapsthatcou, kforsharedo, letheshorts, lookforsha, atcouldbeu, laredoverla, toassembleth, ortsequences
Note that spaces have been removed, as the DNA language uses other formatting to distinguish "words" (genes) from each other. Now, with these short reads, our task is to assemble them into a meaningful clause. As I mentioned, computer algorithms look for unique overlaps to line up the short reads like this:
Example of a multiple sequence alignment
Each of rows 1–9 contains one of our short reads. The algorithms then assemble a "consensus sequence" (which we then define as the sequence of the chromosome) by looking for the most common letter at each position. Note, for example, that the short read in row 3 begins with an "L," but the other two reads that overlap this position (column I) contain "H." Thus, the consensus sequence assembly (row 11) contains the more common letter (H) in column I. The L in row 3 represents one of the mistakes that are occasionally generated during the experimental reading of the chromosome.
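As an aside for spreadsheet fans, this "most common letter" tally is easy to mimic with the countif() function we'll use again below. For illustration, assuming the nine short reads occupy rows 1–9 and the position in question is column I (your own layout may differ), these two formulas would report that "H" outnumbers "L" two to one at that position:

=countif(I1:I9, "H")
=countif(I1:I9, "L")

The consensus letter at the position is simply whichever letter receives the largest tally.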
One other technique used to improve assembly efficiency
When working with chromosomes, we can give ourselves one more piece of information that helps computers speed up the process of sequence assembly: we take a chromosome and break it into random pieces that are all the same length, say 1,000 nucleotides long, and we read the first and last 35 nucleotides of each piece. Thus, we know how far apart on the chromosome the two sequences in each pair should be placed (~1,000 nucleotides apart).
The classroom exercise
I imagine that by now you're starting to appreciate the depth of the issue: the degree of complexity inherent in producing a chromosome-length sequence assembly comprising millions of short sequence reads. To help my students get a feel for this process, I generated a problem similar to the Google Sheets screen shot above. In this exercise, I took the known DNA sequence of the human melanocortin-1 receptor (Mc1r) gene, which is specifically relevant to the manuscript we read, and divided it into simulated deep seq reads. I will note here, though, that I didn't tell the students which gene this was. Instead, I framed the activity this way: we had obtained deep sequence reads from an unknown part of an ancient genome, and our job was to assemble the short reads and then determine the identity of the region of the genome we had sequenced.

In this case, for efficiency, I made each sequence 100 nucleotides long and told the students that the paired sequences were from 300 nucleotide DNA fragments. Thus, in the final assembly of the Mc1r gene, each student has a unique 100 nucleotide sequence, followed by a gap of 100 (unknown) nucleotides (the middle of their DNA fragment), followed by 100 more nucleotides. I used the rand() function to generate a random distribution of fragments for my 20 students, and then shared the Google Sheet with all of them during class.
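If you'd like to build a similar sheet, here is a minimal sketch of the idea (not my exact setup): rand() returns a random number between 0 and 1, so a formula like the following picks a random start position for a 300-nucleotide fragment somewhere within the 954-nucleotide gene, and filling it down one row per student generates a random distribution of fragments:

=round(rand()*(954-300))+1

Note that rand() recalculates every time the sheet changes, so you would want to copy the resulting values and paste them back as values (Paste special) before handing the sheet to students.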
In this case, because we worked on this exercise in class and were not using computer algorithms to find overlaps among each other's sequence reads (Google Sheets is not well-equipped to do this), I provided the consensus sequence of the Mc1r gene in the top row of the sheet. My immediate goal was not to have them practice the deep sequence assembly part of the process, but rather to use a spreadsheet as a tool that gives students the ability to hand-align short sequences to a larger consensus sequence. The students used the shared Google Sheet to align their short sequences (rows 4–27) to the consensus (row 2) in real time by cutting and pasting each sequence into the appropriate series of columns (the gene is 954 nucleotides - or columns - long).
Screenshot taken while students were editing the shared Google Sheet
After the multiple sequence alignment is produced, one region looks like this:
Example analysis of a multiple sequence alignment
where the top row is the known Mc1r consensus sequence, the second row is the consensus sequence derived from the short sequence reads the students are working with, and the bottom three sequences are individual deep seq reads. Note that position 919 has two different nucleotides: two short reads have a "G" (yellow), and one has an "A." I constructed this difference because our genome sequence manuscript noted that the ancient human genome contained a mutation in the Mc1r gene that might have produced red hair (a G instead of the usual A at position 919). Incorporated into this exercise, this finding allows students to discuss the concept of developing a consensus sequence and how low "read depth" (the consensus sequence at position 919 here is only generated from three independent sequences, only two of which agree) could produce a misleading consensus sequence.
Conclusion One
A benefit of using a spreadsheet like Google Sheets for this type of activity is that students can work interactively with more data, hand-aligning multiple sequences simultaneously by cut-and-paste (or drag-and-drop). Conveying a grasp of the solution to the problem, as well as the magnitude of the problem in real-life situations, is next to impossible (in my hands) using a traditional paper-based exercise, or (worse) by drawing on a static white board.
Additional exercise components
Another benefit of using a Google Sheet, instead of showing students existing web-based sequence alignment tools, is that the interface is more interactive, intuitive, and also lets us easily practice calculating some basic information about our alignment. With our spreadsheet-based multiple sequence alignment, produced collaboratively by all of the students, we can:
- Calculate and plot the read depth of the sequence at every position
- Find the average, maximum and minimum read depths
- Quickly identify positions where a deep sequence read doesn't match the consensus
1. Read depth and plot
At each nucleotide position (1, 2, …), we just want to know how many sequences contain a letter at that position. This is easily accomplished using the countif function. For example, to calculate read depth at position 919 (we'll assume a layout in which the nucleotides for each position occupy a row - here, row 919 - and the 22 deep seq reads occupy columns C–X), use the countif() function:
=countif(C919:X919,"?")
where each cell in the range C919:X919 that contains any single character (specified by the wildcard ?) is tallied. The cell would display the number 3 (the read depth). This formula can be rapidly copied and pasted (filled down) to calculate read depth for every nucleotide position.
We can then plot the read depth across the sequence:
2. Maximum, minimum, and average read depths
Across all 954 nucleotides, we can quickly identify the maximum, minimum, and average read depths by entering the following three formulae into three different cells:
=max(<range-containing-read-depth-values>)
=min(<range-containing-read-depth-values>)
=average(<range-containing-read-depth-values>)
where <range-containing-read-depth-values> would be replaced with an expression like Z1:Z954, indicating the cell range where the read depth values are located. The brackets (< >) should not be included in the entry.
3. Mismatches to the consensus sequence
It can be time-consuming (and error-prone) to visually compare each 100 nucleotide sequence to the consensus sequence to find any places where the two do not agree (like at position 919). A faster way to accomplish this using a spreadsheet is with the if() function, where you specify a condition that has to be met. When true, one value you specify is displayed in the cell; when false, a different value is displayed:
=if(<cell-containing-deep-seq-nucleotide> = <cell-containing-consensus-seq-nucleotide>, 0, 1)
In this case, we're testing the condition where our deep sequence nucleotide matches (is equal to) the consensus nucleotide at that position. After the first comma, we enter the value-if-true (the value we want displayed in the cell if the two nucleotides match): in this case, a zero. After the second comma, we enter the value-if-false, which will be shown in every cell where the two nucleotides are not the same. Filling this formula across all 954 columns will let us visually scan across that row to spot any places where a "1" is shown, visually indicating a position of mismatch. There is a more advanced technique to even more quickly learn which columns contain a 1 and not a 0, but I'll leave that for another time.
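A simpler interim shortcut, sketched here with placeholder ranges in the same spirit as above, is to let the spreadsheet do the scanning: countif() can tally how many mismatches (1s) appear in a read's row of if() results, and match() can report the first column position where a 1 occurs:

=countif(<range-containing-if-results-for-one-read>, 1)
=match(1, <range-containing-if-results-for-one-read>, 0)

If the first formula returns 0, that read matches the consensus perfectly and there is nothing to hunt for.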
The punch line of the exercise
At the end of the exercise, the students are able to copy the consensus sequence into the web-based Basic Local Alignment Search Tool (BLAST) to search a nucleotide sequence database for matches to known sequences. This is where students discover that the sequence they've been working with is the human Mc1r gene. Finally, based on the local alignment in the figure above, they work together to identify which of the individual deep seq reads came from a modern human (A at position 919) and which from the ancient human (G at position 919).

At this point, I concluded the exercise by demonstrating one last web tool for use in analyzing DNA sequences: Transeq, which is a program that translates a DNA sequence in all six reading frames. I translate both the original consensus sequence and the student consensus (with G at 919), so that students observe how this nucleotide sequence difference impacts the sequence of the encoded protein.
We then summarize by discussing how difficult the entire process would have been - and how much longer it would have taken - if I had not initially provided the known consensus sequence, which is always what the genomicist is seeking to identify with deep sequence reads in the first place.
Conclusion
By using a shared spreadsheet instead of a paper process, students can tackle an involved problem more quickly by working together on independent tablet computers. Further, I integrated the use of several web tools that practicing geneticists use on a regular basis as part of their research. The use of a spreadsheet to make calculations and graphs and to analyze data helps students develop quantitative analysis skills that they can apply in other courses (and, of course, in life).
Sunday, November 6, 2016
Syllabi: mobile tech and the digital divide
I cannot believe how quickly this semester is flying by (and, likewise, how long it has been since I posted here!) My forays into tablet pedagogy have brought new opportunities (read: additional work), including serving as co-Chair of Fresno State's DISCOVERe Taskforce subcommittee on assessment, as well as now being the co-Coordinator of one of campus' newest Faculty Learning Communities: Advanced DISCOVERe. In fact, in both of these "co-" situations, my colleague Mary Paul and I are working together to help identify and then disseminate the value that educational technology can provide both students and instructors. Watch us discuss this briefly here.
As I teach more tablet-based classes, I have identified some specific items that I suggest including on any syllabus for a class that intends to have all students use any type of mobile technology (i.e. smartphones, tablets, laptops), whether it is a program like DISCOVERe or whether it is a BYOD (bring your own device) situation. I have written about such syllabi before, but here are some new thoughts:
If you intend to have students use technology to complete any assignment, and that is truly your philosophy and intention, be firm and state on your syllabus that the technology must be used. If you can't defend the use of the technology over a traditional (e.g. paper) process, then the use of technology might not be warranted.
Even in the DISCOVERe program, where each student is required to bring a tablet to each class meeting, it is not uncommon to find smartphones and/or laptops being substituted. If, as the instructor, one feels particularly strongly about allowing one or two of these technologies, but not the other(s), then it is good to clearly articulate in the syllabus what will happen should a student not be using their tablet during class.
I've described before (e.g. here) how I moved test-taking into the digital realm, requiring students to annotate a PDF of the exam. What I realized at the end of last semester was that students were taking two liberties that I hadn't explicitly dealt with in my syllabus, and so (at the time), I felt like I had no recourse to intervene. Those were:
Use of multiple devices
I occasionally saw students using a laptop and tablet, and/or smartphone, during an exam. This gave me the uneasy feeling that some backchannel communication might have been going on during the exam. If you want to limit student digital access to each other during an exam, limiting the number of devices allowed to be used might at least make it less efficient to carry on digital conversations with others.
Use of tablet keyboards
Some students had purchased external keyboards for their tablets - something not required in the DISCOVERe program. This made me immediately concerned about whether this was putting some students (who might not be able or willing to afford that extra expense) at a disadvantage. I currently ban the use of audio during the course, because I feel that I would have to require students to use earphones so as not to distract other students. I don't think that requiring every student to purchase headphones is reasonable (even if most already have them - some will not), and so I am currently pondering whether to ban external keyboards as well.

I hope that providing these thoughts will help you strategize what you place in your syllabus for the 21st century classroom!
Wednesday, September 21, 2016
Google Sheets: share a sheet or distribute copies with Google Classroom?
I can't believe I've had a three-month hiatus from posting! Academic life intervened, you know: with an active federal research grant, more of my time is dedicated to mentoring students in my research laboratory now; I'm only teaching one class this fall semester. This is both good and bad for this blog and for my exploration of the application of mobile technology to pedagogy.
Bad: I spend less time this term focused on this topic.
Good: with only one class to prep, which I've taught before (a graduate course in molecular biology, based almost entirely on reading and critically evaluating published research literature), all of my efforts are highly focused on leveraging tablets in this course.
I've taught one other graduate course as a DISCOVERe course before (DISCOVERe is Fresno State's initiative to teach courses in which every student is required to use a tablet computer during class). Transforming this molecular biology course has been different, though, because of the learning outcomes, which include:
- scientific information literacy (identifying and accessing primary literature sources)
- quantitative analysis and reasoning; critical thinking
Leveraging tablets for the former has been relatively straightforward: we explored how to locate primary research literature on the web. Integrating the latter with a digital approach has been a blast to work on so far (now five weeks into the semester)! My goal now is to share with you a couple of ideas of how to use Google Sheets to help students:
- appreciate biology
- peer-evaluate information literacy skills
- analyze and understand the limitations of published research
Week 4: Appreciate Biology and Peer-Evaluate Information Literacy
The topic for the class period: DNA replication
To help students get a sense for why studying DNA replication is important, I asked them to use their tablets and the web to find a few pieces of information. The students entered these data onto a collaborative (shared) Google Sheet, which I had formatted in advance, with each student's name entered in adjacent columns:
Each student worked to find published evidence of:
- the size of a chromosome from a species of their choosing (in base pairs)
- how quickly chromosomes from a species are replicated (in base pairs per minute)
- how rapidly cells from different species divide (in minutes)

After students had edited this Sheet, we were able to:
- perform calculations (to fill in the red cell, above; a formula sketch follows this list) that revealed that, in theory, it should take DNA polymerase (the enzyme that replicates chromosomes) longer to replicate a chromosome than a cell takes to divide (which would be a serious issue for our cells, if that were true)
- see where all of the students were obtaining their information (we then had a discussion about the appropriateness of using different sources)
- assess whether independent data sources all pointed to the same fact: that there is a discrepancy between chromosome replication times and cell division times
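To make the first of those calculations concrete, here is a minimal sketch (with illustrative cell references; the layout of my actual sheet differed): if a student's chromosome size (in base pairs) is in cell B2 and the replication speed (in base pairs per minute) is in cell B3, then the predicted time for one DNA polymerase to replicate that chromosome, in minutes, is:

=B2/B3

Comparing that value to the published cell division time (in minutes) is what exposes the apparent paradox described above.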
I emphasize here independent data sources in particular, because my opinion (based on my experience as a student) is that teachers should have students develop the examples used in class, to remove any question of whether the teacher's favorite case study is an outlier that doesn't represent a general trend. This is a great use of mobile technology for searching the web for information!
This process didn't work flawlessly (which it never does the first time around). I ran into two issues with deploying this exercise. Please, learn from my mistakes!
- I didn't provide the students enough in-class time to perform the entire exercise (which is why the screen shot above shows that the data entry was not completed)
- I shared the URL to the shared Google Sheet via a goo.gl shortened URL, displayed to the class on the video projector; some of the students had issues entering the URL into their web browsers, mainly because of ambiguities such as whether a character in the URL was a lower-case L or a numeral one, or a zero vs. a letter o, and so on. While I also projected a QR code that students could capture with a QR code scanner app, it wasn't apparent that anybody used this approach (which probably would have been preferable).
Week 5: Analyze and understand the limitations of published research
As in the example above, I created an outline of a Google Sheet for students to fill in information and to practice performing calculations within the Google Sheet. The long-term goal is to get students used to the practice of being skeptical about the methods and interpretations of others (i.e. critical analysis): to use the raw data presented in a research manuscript, to perform and interpret calculations in the ways the students think are best, and to compare the outcomes to the published interpretation (i.e. quantitative reasoning).
In this case, authors of a published manuscript had essentially indicated that, in photographs of chromosomes, every micrometer (µm) of chromosome image comprised 423,077 base pairs of DNA:
My goal was to prompt students to question the accuracy of this very precise(-seeming) value. So, the Google Sheet I developed included each chromosome's length in pixels and a conversion factor (pixels/µm) based on the authors' paper. Together with the published conversion factor (bp/µm), the students calculated the predicted length of each chromosome (not just the one shown in the image above), measured in base pairs of DNA (entered in the blank red cell), and compared that value to the published value (the number in the other red cell). It became rapidly clear that the process the authors had used to estimate the sizes of chromosomes was imprecise.
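A minimal sketch of that calculation (again with illustrative cell references rather than my actual layout): if B2 holds a chromosome's measured length in pixels, B3 the image conversion factor in pixels/µm, and B4 the published conversion factor in bp/µm, then the predicted chromosome size in base pairs is:

=B2/B3*B4

That is, pixels divided by pixels-per-µm gives the length in µm, and multiplying by bp-per-µm converts that length to base pairs.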
For this exercise, instead of a collaborative worksheet, I used Google Classroom to distribute to each student her/his own copy of the sheet to edit individually. This process seemed to work more smoothly than a group-edited sheet, with the drawback that students couldn't as easily see each other's work.
In sum, shared Google Sheet vs. distributing each student her/his own copy of a Google Sheet (using Google Classroom)? It depends on the goal of the lesson: if peer evaluation is involved, the shared sheet makes sense; if not, having Google Classroom produce copies of a sheet for each student is the way to go!
Thursday, June 23, 2016
Cal. State Univ. Course Redesign with Tech. - Day 3 Summary
The highlights of yesterday's sessions included one on inclusivity, a student Q&A panel, an accessibility presentation, and a sneak peek into assessment data from last year's CRT projects. I'll touch briefly on my takeaways.
Overall
The broadest pattern was that much of the day didn't focus on technology specifically, but rather on best practices for teachers to improve student success in general.
Inclusivity
Consider whether/how your incorporation of technology will impact inclusion (sensu lato). The most obvious example would be: if you post videos of all of your lectures, do all of your students have equal access to the technology needed to view those videos?

The presentation focused on how we can engage the diversity of students (again, sensu lato: age, ethnicity, sexual orientation…) in our courses. I focused my attention specifically on how we need to use examples, case studies, and language that are inclusive. One goal is to make material relevant to students, and to do that we need to know about our students and where they are coming from (both literally and figuratively). One practice that I use, which I think addresses this need, is to use exit tickets (brief written responses from students at the end of a class period) to ask students:
- How can you envision the material from class today being relevant to you (or to society)?
This approach not only makes students try to imagine how to apply the material but also provides me with a list of examples of how the material is personally relevant to all of my students. I can use these examples/case studies in the following class period as part of a review, as well as in the following semesters. This approach gives me the opportunity to view the course material through the lens of each student's self-identity, rather than relying on my perception (through the lens of my biases) of student identity.
Student Panel
With questions from the faculty redesigning their courses, topics ranged widely. The opinions of three students, who were willing to attend an academic conference during the summer (i.e. not necessarily representative of all students), suggested that faculty should:
- Use e-mail to communicate with their classes (as opposed to social media)
- Tell students in advance what will happen each class period so they can prepare
- Be personable - make yourself human (i.e., divulge a little personal information - do you have kids? what are your hobbies? Not extensively - no oversharing - but show that you're more than a brainiac)
- Be sympathetic to the student condition (potential influences of housing and food insecurity, necessity of work and family commitments, etc., on their attendance, appearance, performance)
- Not be overly concerned about how they (faculty) dress
The top three student suggestions with the broadest consensus:
- Provide not-filled-in lecture slides in advance of class, for students to take notes on
- Audio and video record and post each lecture
- Provide opportunities for virtual office hours, particularly at times students are most likely to need it (evenings, just before high-stakes assessments)
Accessibility
After a fantastic side-by-side comparison by a visually impaired student of the efficacy of using a screen reader to navigate a poorly-designed vs. well-designed syllabus, the importance of making syllabi accessible was clear. The question I asked the presenters after their session was: why is the focus (at least on my campus) always about accessible syllabi? The answer, as I had predicted, was: because it is something that every faculty member is required to produce. So, it is an easy starting point. But, the conversation will be moving beyond just a myopic focus on getting faculty at least to make accessible syllabi.

One of the pain points is (and will be, for the immediate future at least) how to make visual media accessible. The short-term solution: double-code everything. This is the same principle as using alt text to provide text descriptions of graphics, so that individuals who cannot see them can read about what the image contains/describes. When teaching, instead of using non-specific/non-descriptive phrasing like, "See how these photos show Donald Trump's hairstyle changing over time?" a more accessible version would include the instructor actually verbally describing what they see. This improves accessibility for all - which is the goal of accessibility - because it makes clear (to all of your students) the pattern you're hoping to point out. A second solution, which will take longer to roll out, is to use 3-D printers to make tactile models.
Assessment Data
Based on survey results of students who have taken courses redesigned as part of the CRT program, the following best approaches for faculty using technology to redesign courses were distilled by the Chancellor's Office (n.b. these are still preliminary results and from attitude/opinion questions, not from measures of student success like DFW rates):
- Record and post lecture videos
- Flip the classroom by pre-recording short lectures students must watch before attending class
- Use clickers to maintain student engagement and collect formative assessment data
- Provide digital collaborative opportunities, e.g. via shared Google Docs
My reflections on these preliminary findings:
- I'm extremely happy to find that many of the approaches I've focused on over the last three years are represented in this list
- lecture capture (and see here for response to the concern that an instructor might make a mistake during class and have it recorded)
- anonymous student polling
- Google Sheets and Google Docs for collaboration
- Only one of these four practices involves in-class instructor (or student) use of technology. My initial reaction to this is that faculty probably resist having to rely on technology in the classroom and are most comfortable (not surprisingly) using the technology outside of class. I just ask faculty to remember that having a tech failure occur during class not only helps us show our students that we, too, are human and fallible, but also helps us model for our students how to adapt to adversity.
Wednesday, June 22, 2016
Cal. State Univ. Course Redesign with Tech. - Day 2 Summary
Yesterday's program involved CSU faculty presenting on best practices in Course Redesign with Technology (CRT) and then a vendor faire and breakout sessions on various classroom technologies. Here are my biased impressions about the content, particularly related to the products the vendors offer, ridiculously distilled into a few points:
- Captioning technology can provide more than just monospace font family letters on a black background on your video (including interactive text transcripts, and a text search feature that graphically shows students the density of the search results as a function of temporal position within a video). I'm looking forward to using automaticsync.com
- Faculty care a lot about academic dishonesty (i.e. cheating on online quizzes/exams), and the CSU system has contracts with two vendors (ProctorU and Proctorio) that use technology to administer and proctor online assessments
- Digital collaboration and feedback are ways for faculty to leverage technology to, presumably, improve student engagement and faculty efficiency. The former (demonstrated via Zoom yesterday) might involve videoconferencing for faculty office hours, to support students who can't be physically present during your scheduled office hours, for example. Zoom also has a shared whiteboard feature, and meetings can be recorded (e.g. for distributing to those who couldn't even attend remotely). The latter might involve enhanced ways to provide verbal (recorded) comments on student written work.
I have two mild concerns (just to play devil's advocate) related to these last two uses of technology:
- Isn't cheating really just collaboration? Identifying and/or preventing cheating perhaps should not be our focus; rather, maybe we should encourage it - this might give us an opportunity to control and appropriately channel the efforts some students go through to get good grades on tests.
- I regularly hear faculty use the phrase "meet the students where they're at." I think I appreciate where this sentiment is coming from, but I also wonder to what extent we're pandering and exacerbating existing trends…we need to strike a balance between support and rigor.
Tuesday, June 21, 2016
Cal. State Univ. Course Redesign with Tech. - Day 1 Summary
Now that day one of the California State University (CSU) Course Redesign with Technology (CRT) 2016 summer institute is behind us, I'm going to summarize my takeaways:
Principles & Practices & Problems & Questions & Goals
Principles (emergent principles of efficacious CRT)
- CRT can help engage students (but active learning ≠ employing ed tech - active learning can be computer-free)
- Thus, it is critical to develop a sense of when to use tech and when not to
- CRT is being leveraged broadly by faculty in high-enrollment classes to improve workflow efficiency (at its most simple, management processes like taking roll, distributing and collecting assignments)
- In a flipped or hybrid course, an instructor can run the course remotely (e.g. while I'm traveling to conferences, or sick, or… I can provide students with reading and video lecture assignments, as well as digital collaborative group assignments…as long as I have cell service or wi-fi where I am)
These last two points can be leveraged as the "carrot" for expanding CRT efforts beyond early-adopter faculty
Practices (best practices distilled from my experiences in CRT + talks yesterday)

1) Setting student expectations is critical. Tell them regularly that your course will be unlike any they've ever had before. Explain (with or without data) that your goal is to teach them skills (not just content) they need to know to survive in any workforce, such as:
- becoming lifelong self-learners (i.e. practicing how to educate themselves)
- information literacy: how to find and assess the quality of information
- work effectively with others
- make evidence-based decisions
and therefore that your course will not be based on rote memorization and information regurgitation on exams
2) Spend time at the start of the course talking about how to take notes, especially for videos! Students will watch videos passively (like T.V.), so it is critical to get them practicing taking notes during videos. Richard Levine: it might be a best practice to have public viewing of pre-lecture videos with a TA present, and/or to ask students to watch pre-lecture videos in groups
3) Make online lecturing "interactive" - Ji Son: break up pre-lecture videos by inserting questions (using zaption.com, for example)
4) Be prepared, have a backup plan, be flexible. Model to students how to adapt when problems / challenges / opportunities arise. Flexibility includes not being tied to a single workflow or app (a point I'll make again below): students, many of whom are digital natives, will probably discover (or already know) more efficient methods of doing things on computers/tablets/smartphones than the instructor is aware of. This one is really difficult, but try to accommodate this diversity.
5) Gamify education to engage students (e.g. Mary-Pat Stein from CSUN recommends https://getkahoot.com); Estelle Eke uses mobile tech to run in-class student "competitions" where students are placed in anonymous pairs and compete against each other to answer a series of questions.
Problems (things to discuss with colleagues and cohorts)

1) A recurring concern across the day: faculty use flipped classroom approaches and students don't come prepared to class, even with low-stakes, frequent assessment to attempt to motivate students to do the work in advance
2) Students coming into a course with one or both mindset issues about the class topic:
- I already learned it (thus, we faculty need to make the topic new/different/advanced/relevant)
- I already failed to learn it, so I can't (fixed vs. growth student mindset)
3) Alternative conception that a flipped class means faculty can cover less content AND that it requires students to do more work.
Questions (do you have answers? please post a comment!)

1) Is a consistent classroom approach good or bad? Each day, the structure of my class is (to date) exactly the same. An "entry quiz" formative assessment, followed by discussing the answers to those questions. Then, brief mini-lectures on topics as necessary, followed by student activities/active learning, followed by an "exit activity" (reflection, exit ticket, summative assessment)… Pro: students know what to expect. Con: students know what to expect.
2) How many apps/approaches constitute too much redesign at once for students who are new to tech-redesigned courses?
3) Should students be placed in groups or form their own groups?
4) For the flipped classroom approach (with pre-lecture videos), is it important for the student to see the instructor in the video (e.g. with Learning Glass from SDSU) or is screen capture with audio voiceover OK?
Goals (for me, for the future, inspired by presentations yesterday)
Simple (but very complicated): to teach a CRT class that doesn't rely at all on using any specific, proprietary apps or technology. I don't like relying on developers to keep their apps in existence, updated for new operating system releases, or available on all possible operating systems and tablet/smartphone/computer platforms. This goal is part of the principle that faculty teaching tech-redesigned courses maintain flexibility: critical for dealing with unexpected issues that might arise one day during class. Plus, I'd like to keep student costs as low as possible, and divesting from specific apps, websites, and platforms might help long-term success when, after using a free app for three years (and becoming highly dependent on it), the developer decides to start charging for it. Yes, this has happened to me!
Monday, June 20, 2016
Cal. State Univ. Course Redesign with Technology - Day 1 Plans
I'm currently in San Diego at the California State University's summer institute in the Course Redesign with Technology program. I'm a '16-'17 Proven Lead instructor, and some of you reading this are deciding whether to join my cohort as a Proven Adopting instructor.
You can see my e-portfolio (in progress) here: http://tinyurl.com/1617leadross
Faculty are opting to join my cohort (along with biology discipline co-Lead Nicole Bournias-Vardiabasis) for one of two reasons:
- Interested in pedagogical methodologies/approaches that I (we) use, or
- Interested in developing a cohort of disciplinary colleagues who are actively participating in course redesign
Please join in the conversation; this will be one location (both now and after we depart San Diego) where we can have active discussions.
Today (Day 1 of the institute), we're meeting at lunch with prospective cohort members. I'm looking forward to meeting with the Proven Adopting faculty! This evening, we have an opportunity to go to dinner together. I'm proposing to leave at 5:15 from the Sheraton Bay Tower lobby and to walk (1.5 mi each way) to Issara (Thai), if you'd like to join in. It seems like this is a good approach to wise spending of our dinner per diem, since we can share entrées if we decide to do so.
- Joe Ross
Friday, March 18, 2016
Tablet test troubles travel in threes
Hi all,
I'm just writing three quick thoughts as I proctor my DISCOVERe tablet course's (BIOL 102: Genetics) second exam.
1) When having students complete open-note exams using tablets (by annotation of a PDF version of the exam), a vast number of students bring and access written notes. Some students do access digital notes and materials (i.e. the web) as well, but I'm hearing a lot of paper-shuffling.
2) I had an extreme Google Classroom fail last night (the stuff of professor nightmares!). I uploaded the exam PDF to the Classroom assignment I use for distributing and collecting the exams. I used the drop-down menu in Classroom to choose the "Make a copy for each student" option (i.e., Classroom will send each student his/her own copy of the attached PDF exam file). Then, I (swear I) clicked on the downward-pointing triangle next to the "Assign" button (to access the "Save as Draft" feature), but Classroom just flat-out assigned the exam. Maybe I was tired and not mousing accurately… so at about 9 pm the day before the exam, Classroom sent the exam! Yes, I willingly acknowledge that a computer didn't do anything I didn't physically instruct it to.
Of course, I immediately deleted the assignment. To Google's credit (and perhaps for your future use), I learned that the student copy Classroom sends each student only gets created when the student loads the assignment on their computer! Thus, I was able to determine that a single student (whose name I know because Google Classroom adds the student name to each filename) was, indeed, able to access the exam in that ten-second span when I accidentally assigned it and then deleted it. To her extreme credit, she had already e-mailed me immediately to let me know she had accessed the file, not having realized what it was (honesty is the best policy - always!)
3) Here's the potentially bigger concern for me, and if you have a suggestion, I'd love to hear it! My brainstorm has fizzled into a drizzle at this point. During each exam, I keep a tally of students physically in the classroom. Right now, almost half-way through the exam, I have 81 students present. My total enrollment is 86, and Google Classroom has generated 85 exam copies. I'm only partly concerned: because this is an "open-resource" (notes, textbook, internet) exam, I'm partly OK with this apparent remote test-taking. However, one thing disallowed (this semester) is student collaboration (i.e. no talking, no texting or chat sessions). Thus, I'd like to be sure that the four missing students who are apparently taking the exam aren't physically together and collaborating on it. Yes, collaborative exams are the ultimate goal (for me) in tablet courses, but this semester we're not yet there. Fortunately, because I don't curve my grades, this doesn't negatively impact any of the other students in class, but I still want testing to be fair and equitable.
So, can you please help me? If you were me, what control could I put in place to know what students are missing from class on test day? In a class of 86, there's no way I would be able to easily determine just by looking at faces which five students were not present. Yes, I could use a sign-in sheet, but that's easily cheatable if an in-class student decides to add the names of a few classmates who happen not to be present. How do I find out who is absent (but digitally present)?
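One control I've been mulling (a sketch only, not something I've tried): pair the exam with a quick in-class digital check-in (for example, a one-question Socrative prompt), then compare that check-in list against the list of students whose exam copies were opened, since Google Classroom adds each student's name to the filename. The names below are purely hypothetical placeholders.

```python
# Hypothetical sketch: who opened the exam but never answered the in-class check-in?
# Both lists are assumed to be plain name strings exported from the respective tools.
checked_in = {"Student A", "Student B"}                # from the in-class check-in export
opened_exam = {"Student A", "Student B", "Student C"}  # parsed from Classroom filenames

digitally_present_but_absent = opened_exam - checked_in
print(sorted(digitally_present_but_absent))            # -> ['Student C']
```

The check-in itself is still cheatable in principle (a present student could answer on behalf of an absent classmate), but it at least requires an active act of deception rather than a passive omission.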
Thanks for your input as I keep refining this approach of in-class digital exams that leverage internet access!
Now, back to proctoring - to make sure that the students taking my purportedly cheat-proof exam aren't cheating…?
Friday, February 19, 2016
Cheat-Proofing Exams IV: Student Feedback
As I've written about before, one of the immense values of teaching a class in which students are expected to use internet-accessible devices is that it allows new types of exercises and opportunities. This includes, for example (in genetics), accessing online sequence databases and having students perform analyses on DNA sequences using web-based tools.
Philosophy of the Cheat-Proof Exam
By writing exams focused less on lower Bloom's levels and more on analysis, evaluation, and creation, my hope is that I'm getting closer and closer to crafting a truly cheat-proof exam. At the heart of the cheat-proof exam are two core aspects of course philosophy. First, what practicing professionals in the discipline do is called collaboration, not cheating. In fact, I actively promote collaboration during some exams - my only requirement of students is that they cite all sources (including personal notes, internet sites, and fellow students) when recording answers to questions. Otherwise, I tell them, copying is cheating. Second, when testing for higher Bloom's levels, it is easier to write cheat-proof exam questions: particularly those that would only have unique answers (such as essay-style response questions). In a genetics class, for example, I like the approach of starting questions with something like, "Access the NCBI database and find the sequence of any gene." Given the vast number of DNA sequences in this database, the probability that two students happen to choose to write an answer involving the same gene is infinitesimal (or smaller), so if this actually occurred, I'd have a strong case for proving cheating (er, collaboration), if I were inclined to do so.
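Just how infinitesimal? Here's a back-of-the-envelope sketch (my own illustration, not part of the original exam design), assuming each student picks uniformly at random among N candidate sequences; the value of N below is a placeholder, since the NCBI nucleotide database holds on the order of hundreds of millions of records.

```python
# Birthday-problem estimate: P(at least two of k students pick the same sequence),
# assuming independent, uniform choices among n_sequences candidates.
import math

def collision_probability(k_students, n_sequences):
    return 1.0 - math.exp(-k_students * (k_students - 1) / (2.0 * n_sequences))

print(collision_probability(95, 2e8))  # ~2e-5 for a 95-student class and 200 million records
print(1 / 2e8)                         # ~5e-9 chance that one *specific* pair matches
```

In reality students don't choose uniformly (famous genes attract attention), so matches are somewhat more likely than this estimate suggests, but two identical full answers would still stand out.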
The Hitch
Of course, you've already spotted the trade-off. It is relatively easy to write a cheat-proof exam. It is more effort to grade them. As such, whether you decide to adopt the approach of writing cheat-proof exams depends almost entirely on your predisposition to spending time grading answers to a question where there is no single correct answer. My genetics classes usually have about 95 students in them, but I still include cheat-proof questions.
Solutions to the Problem
- It is guaranteed that writing exams that incorporate higher Bloom's levels will take more effort to grade. However, there are a few approaches that can make grading more efficient. First, write multiple-choice questions that still require calculation, interpretation, or analysis to determine the correct answer in the first place. Multiple-choice doesn't have to be limited to factual recall questions.
- I should also disclose that I adjusted my grading scheme to account for the increased difficulty of exams designed to incorporate all levels of Bloom's taxonomy. There are six Bloom's levels, and I try to distribute point values evenly among them (I combine "evaluate" and "create" at the top of the tree). Thus, I align letter grades with Bloom's levels. Students who score 0-20% of possible points earn an F - they tend only to be able to perform factual recall. Between 20-40% of points on an exam earns a D; 40-60% a C; 60-80% a B; 80-100% an A. To earn an A, then, students have to be able to earn the upper-most 20% of points, which means they have to successfully answer the questions I write that require creating knowledge or critically evaluating information. In other words, I use Bloom's taxonomy to define my expectation of what students have to achieve to earn letter grades (see the short sketch just after this list).
- Use group exams. This allows me to feel like I can write much more difficult questions, because groups can opt to distribute workload or at least to discuss options for solving a problem before selecting one. This has been (as expected) much more efficient for grading. If I have groups of four students working on an exam, I grade one fourth the number of responses.
- Leverage the tablet for collecting drawn responses. I don't know if it is true, or just my perception, but I feel like grading written responses takes longer than graphic (drawn) responses. If a picture is worth a thousand words, and if it is faster to assess a student-created picture than it is to read a thousand of their words (guaranteed!), then it is much more efficient to grade a visual answer to a question. In the sciences, one of the easiest ways to incorporate such a question is to ask a student to draw a plot (or diagram of results) of what they would expect in the situation that:________.
- Now to the title of the post! Last term (the first time I attempted the "open-note, open-internet" tablet-based exam), I only gave students limited feedback. The way I collect student exams from their tablets is that they submit them (via Google Classroom) as PDF files to me. This has so many benefits, including my ability to efficiently store student records and to easily transport them from office to home for grading (because we all know that grading often happens at home in the wee hours of the morning, right?). However, I have yet to find the best way to give students feedback on their exams…
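Returning to the grading-scheme bullet above: here is a minimal sketch of those Bloom's-aligned cutoffs (the 20%-per-band alignment follows the scheme described; the function itself is just my illustration).

```python
# Map an exam score (percent of available points) to a letter grade using
# the 20%-per-Bloom's-band cutoffs described above.
def letter_grade(percent_of_points):
    bands = [(80, "A"), (60, "B"), (40, "C"), (20, "D")]
    for cutoff, letter in bands:
        if percent_of_points >= cutoff:
            return letter
    return "F"  # 0-20%: factual recall only

assert letter_grade(85) == "A"  # only reachable by answering evaluate/create questions
assert letter_grade(55) == "C"
```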
Digital Student Feedback on PDF Exams
What I do currently is to add "sticky note" text fields to the student PDF, and then I send those edited PDFs back to each student. This is a very time-intensive process, and the only information I put in those notes is: if a student earns less than 100% of points available for each question, I leave a note that indicates what score they earned. That's all of the feedback students received last term. Because I video-record and upload video exam keys, and because it is more useful for student learning for them to have to do the work to find out where they missed points on a question, I actively do not try to provide extensive feedback to students.
However, this semester, I'm making a slight change: I'm not even providing this much feedback. I'm finally, after reading about them for years, implementing "exam wrappers." The core idea here is that you provide a small assignment that requires students to consciously reflect on their performance on an exam. Often this takes the form of asking students, at the end of an exam (or just after they learn how they scored), what they think they should do differently to prepare for the next exam. Today, this is what I told my students about how they get feedback on their exam performance:
- I want to motivate students to reflect on where they didn't meet my expectations
- I post exam keys (both a static PDF as well as the video key, in which I narrate and write/draw out answers to questions)
- Students have a digital copy of the exam they submitted
- Students only know their total score on the exam
- I show the entire class the letter grade distribution for the exam
- I also show data on the percent of students earning all of the points available for each question
- I expect students to compare their answers to the answer key and try to assess which question(s) they think they lost the most points on
- Each student has the opportunity to write/draw a response to me in which he/she chooses a single question on the exam (the one he/she thinks is most likely the one they lost the most points on) and writes a description of how/why they think they did not score all of the points. I said that, if the explanation is accurate, then the student can earn up to half of the missed points on that question.
The Bottom Line
This is a new approach for me, so I'll post updates here on whether students take advantage of this opportunity, and also whether it winds up helping the students master the material. As I said in class today, my main concern is that students understand critical concepts (and be able to demonstrate that understanding) by the final exam. My philosophy is that, for the questions where most of the class doesn't perform well, I'll bring a related question back for the final exam. I want to foster student growth over the term. And, I tell the students this. So they should know, at this point in time, which topics will be showing up on the final exam. And, if they're willing to devote the work necessary to master those concepts, I think those students are the students who have truly earned A grades.
In sum, although you might not agree with this teaching/grading philosophy, I hope you will at least agree that there is value in three aspects of my design for cheat-proof exams:
- Students should be assessed in authentic situations (which, for many disciplines, is going to involve accessing online materials and/or collaborating with others)
- Digital workflows for distributing and collecting exams can improve efficiency
- Our goal as educators is to make information relevant to students and to help motivate them to learn the material by the end of the course
Although my tablet exam approach is certainly a work in progress, it is meant to achieve these three goals. Time will tell!
Friday, February 12, 2016
Collaborative academic rap
I've written before about how to use Google Apps (e.g. Sheets) for real-time multi-student collaborative editing. In class this term, I've asked students to form their own groups in order to summarize each chapter's worth of content by collaboratively writing a verse using Google Docs.
Rhymes
(the pedagogical philosophy)
For example, one student group penned (digitized?) the following after our first chapter, on the concept of natural selection:
"It's all about change across generations' time
It's not in a single life, that ain't our rhyme
That theory came from our man Lamarck
But evolution's DNA, so that didn't work.
It's not a perfect system, the pick's a random roulette
The better go on to proliferate, that you can bet
We don't choose our traits, Mother (Nature) knows best
And now that you know that, here's the rest…"
- de Guzman, McDonald and Olvera
One benefit of using Google Apps and internet-accessible mobile devices in such an exercise is that students not only form groups to help each other learn, reflect on which topics are most important, and distill complex concepts into more simple forms (things that could be done without tablets), but that they can conduct collaborations asynchronously and from any location. More simply put,
"Recognize that one of our limitations
is our limited time together.
Fifty minutes at a time is never enough;
a passionate teacher won't settle for less than forever
Group work is easy when you're in the same space,
but when your group members leave campus, you're all apart
Each take a tablet with a cup of Google apps
And you've got the time to build lyrics a'la carte"
If nothing else, the above should demonstrate why I leave the incorporation of art into a science class to the students: they come up with many more clever rhymes than I could ever hope to design. More importantly, though, tablets allow everybody to provide input to reach a common goal at times (and places) convenient to each group member.
Tablet Use
(boost the digital intensity)
Here are the technical details of the approach. Assuming each student has the Google Docs app (available for free on most, if not all, tablet and smartphone platforms), have students form small groups. Each group chooses one member to create a Google Doc and then share editing access with the group members, by collecting their e-mail addresses and adding them via the "Share" function in Google Docs. Make sure to have each group add you, the instructor, as well. Also, make sure to be clear that each group document should contain a list of all of the group members at the top.
Then, set a deadline. After the deadline, you can navigate to the header of each Google Doc, where there will be a link to the Edit History (e.g. "Last edit was made 2 hours ago by User1"). If you click on that statement, you get taken to a list of all contributions by each group member (assuming that each has signed in via a Google account). This is how I check to see whether each group member at least contributed to the content.
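For instructors comfortable with a bit of scripting, the same contribution check could in principle be automated with the Google Drive API rather than clicking through each document's edit history. This is only a sketch under several assumptions: it uses the google-api-python-client library, it assumes each group Doc has been shared with the credential being used, and the credentials file name and Doc ID are placeholders.

```python
# Sketch: list everyone who saved a revision of a Google Doc, via the Drive API (v3).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "credentials.json", scopes=SCOPES)         # placeholder credentials file
drive = build("drive", "v3", credentials=creds)

def contributors(file_id):
    """Return the display names of users who have saved revisions of the file."""
    resp = drive.revisions().list(
        fileId=file_id,
        fields="revisions(modifiedTime,lastModifyingUser/displayName)").execute()
    return {r["lastModifyingUser"]["displayName"]
            for r in resp.get("revisions", [])
            if "lastModifyingUser" in r}

print(contributors("YOUR_GOOGLE_DOC_FILE_ID"))  # placeholder Doc ID
```

Note that Drive revisions record saves rather than every keystroke, so a student whose edits were folded into a groupmate's save might not appear; the in-browser edit history remains the more complete record.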
Moving Forward
(collaborative propensity)
At this point, we are about to embark on our second bout of lyric production. We're going to continue this for the entire term. I'm especially excited about this because, just after we wrote our first set of lyrics, I noticed that acclaimed rapper Baba Brinkman (famous in scientific circles for his academic rap works, the most notable of which is the "Rap Guide to Evolution," which he has performed for us in the past at Fresno State) is currently crowdfunding his next project, "The Rap Guide to Climate Change," on indiegogo.com.
My Evolution (BIOL 105) class, presented with the opportunity to help fund this album, in exchange for Baba writing and recording a custom rap song (to potentially include some of my students' lyrics!), raised the $1,000 necessary to make this a reality! I sent the funds to Baba's campaign today (February 12), in celebration of Darwin's birthday.
I'd like to thank Baba for being willing to collaborate on this project and thank the students for seeing the long-term benefit of rewriting evolutionary concepts as rap verses (which we'll also use by the end of the term to compile a study guide for the final exam). I'm always looking for ways to integrate the arts into science courses, and so far this approach (especially enhanced by our tablets) has met with great enthusiasm and support!
Wednesday, February 10, 2016
Digital Scabs
No, not those scabs! I'm talking about picket-line-crossing. And this entire post won't be a rant about policies and unions, although it starts off sounding like it. Please read on - there's digital pedagogy relevance down there somewhere:
If you haven't heard, the union of faculty in the California State University system (the California Faculty Association, CFA) has approved strike action for five days in mid-April if the CSU administration doesn't return to the bargaining table to negotiate between their offer of a 2% salary increase and the CFA request of a 5% increase.
According to some reports, this would be a historic (in the USA) strike, partly because the CSU system is the largest university system in the country.
I'm not going to announce here which side of the argument I come down on. However, I will say that, as a DISCOVERe (tablet instruction) faculty fellow at CSU Fresno, I will certainly admit that my first thought, when I heard about the impending strike, was: would a strike impact digital instruction?
My primary concern about the strike is that both sides of the disagreement seem to have made similar statements about the potential effect of a strike on students. In various media outlets, reports have stated that the CSU administration has indicated that a strike should have no effect on student graduation or timely degree progress (I'll note that they were quoted as using the word "should" and not "would"…). Likewise, CFA representatives apparently have made the same claim, leaving me wondering what incentive a strike provides to motivate the CSU administration if student success isn't placed in the crosshairs. This approach doesn't make sense to me. But, never mind that…
A number of my colleagues, in opinions expressed over years past, have consistently stated that they feel taken advantage of, because administrators know that we will pour time and effort into our passion, teaching and mentoring students, simply because we love what we do. Part of our reimbursement, in some way, is the good feeling we get from the job we do. That's worth something to us, but it is hard to put a dollar sign on. And, that may be why a strike could be necessary - to remind administrators that we won't always be pushovers when it comes to increased course loads/enrollments without increases in compensation.
However, I'm dangling in the balance. On one hand, I understand (intellectually) the importance of solidarity in union affairs. I'm particularly sensitive to this as an untenured faculty member. I suspect I'd be more likely to consider crossing a picket line if I had tenure and were relatively more immune to what fellow faculty members would think of me afterwards.
All of this leads back to the same point: how might a strike impact digital instruction? I'd love to hear some of your comments, especially from any of you who might have been through this before. Would I be a "digital scab" if I established, in advance of the dates of planned strike action, all of the reading assignments, exercises, pre-recorded lectures to watch, exams, etc.? Sure, I might not do any active faculty work (reading/responding to e-mails, committee work, class prep, grading, teaching, etc.) during the strike, but am I weakening the faculty cause if I use the digital pedagogy tools my institution has helped me develop to help students continue learning on their own in my potential absence? So far this term, I've done my best to set my students up to succeed as self-learners. Maybe a faculty strike is in their best interests? A not-so-gentle nudge into the deep end of the pool. Maybe the students don't really need me at this point. Maybe this is how we'll find out…
What's a professor to do?
Sunday, January 24, 2016
Course redesign: body by BYOD
My goal is not to convince every teacher and student that using a tablet computer in the classroom is the panacea for student success, much less to convert every course to incorporate technology. Over my last three semesters of teaching some of my classes in a tablet-based fashion, I have learned several key lessons. The most critical of these is:
Understand when a tablet is useful and when it is not
Related Best Practices are:
- Take advantage of things a tablet can do that a smartphone or laptop cannot do as easily
- Don't force your students into money-spending situations, be it tablets or apps or other course materials
- Be equitable: don't enact policies that exclude certain student groups
There will be times to adhere to Best Practice 1 (leverage the capabilities of a tablet). Those times might include instances where form factor, touchscreen, and processing power are important. For example, a teacher might prefer tablets over laptops or smartphones:
- in a classroom with small desks and poor wi-fi internet access where the purpose of employing technology is to facilitate student writing (cellular internet access and a larger keyboard make a tablet a top choice over a smartphone or laptop)
- in a course where travel/site visits, including photography/videography, are frequent (cellular internet access and built-in camera, as well as relatively small size/weight are key)
However, there are plenty of engaging activities to pursue in the classroom that could use a smartphone, tablet or laptop. Every one of us can easily improve course efficiency and/or student engagement with such a Bring Your Own Device (BYOD) approach. This is a useful approach when your student demographic and campus resources indicate that every student can have an internet-enabled device in class (e.g. many campuses have student laptop loaner programs). Unless you have extensive upper administration support (e.g. financial and infrastructure), BYOD will be the best way to start.
Being device-agnostic (and operating system-agnostic, too) is especially important to address Best Practices 2 (keep costs low) and 3 (be equitable). If all students in your course already have a smartphone, laptop, or tablet, it might be best to accommodate their devices and not require new purchases.
A key step in building your BYOD course is to find free apps that are platform-agnostic (e.g. work on Apple, Microsoft, and Android operating systems). Fortunately, there are many quality apps that are free and are available through the Apple and Microsoft app stores and Google Play. The best way to ensure that your app is device-agnostic (e.g. works on mobile operating systems + laptops) is to find web-based apps.
A great example of this is Socrative, which can be run either through a mobile app or in a web browser. This app is great for in-class quizzes, informal assessments, and even taking attendance; it is easy for students and instructors to use, and it is free. Other key apps include Google apps (e.g. Drive, Docs, Sheets) and a PDF viewer/annotator (e.g. Adobe Reader; Xodo).
Enjoy imagining and executing the vision for your BYOD course redesigned to incorporate technology! If you're still not sure how to start, here's a great first step: identify one process in your class that could be more efficient using technology. For me, technology was the immediate solution to taking attendance (though I understand not all of you record attendance in your classes). Find one app that you can use to accomplish your one goal. This is not always easy to accomplish. Thus, on campus, I'm advocating having our tech-engaged faculty actively reach out to colleagues to facilitate this initial step of reflection, as it is understandably difficult to self-critique when one might not be familiar with the ways technology might be able to improve efficiency and student engagement. If you have these colleagues, please reach out to them for their advice and feedback! You have one such colleague here, so please post any feedback or questions below.
Sunday, January 17, 2016
Straight from horses' mouths: Fall '15 data roundup
Now that two semesters of tablet instruction have passed, I'm enjoying learning from my students which aspects of my mobile technology instruction are successful and which are not. Reviews of last term's assessments have revealed some interesting responses and trends. My hope is that sharing these student attitudes and impressions with you will both catalyze changes to your teaching as well as provide some cautions about teaching with mobile devices. To start, let me briefly recap what I've been doing in the classroom (Approaches), and why (Goals).
Tablet Course Redesign
Goal | Approaches | Apps Employed
Provide students with resources that support their studying and learning | |
Expose students to authentic disciplinary practices | | Web browser
Employ evidence-based teaching practices | |
I'll share here the good (and the not-so-good) data on progress toward achieving these goals. These data and student comments are from IRB-approved assessments from the Fall 2015 semester of my genetics course.
Resources that support student learning
97% of students reported that they used instructor-recorded video to help study for exams, and 90% of students reported that watching recorded lectures was critical.
"I really like the tablets because I could see the lecture notes on my tablet clearly and I also really like the recording of the lectures because those really help me with studying for the test."
Over half of the students indicated interacting with other students during class more or much more than normal. Over 80% of students reported remaining in the same groups during the semester. Only half of the students knew their other group members prior to the start of the semester, but 62% of students reported interacting with each other outside of class. This suggests, to me, that employing group work as part of this course might have improved student sense of belonging and also enhanced peer instruction.
Exposing students to authentic disciplinary practices
Much to my satisfaction, the vast majority of students reported that using a tablet computer in this class helped them grow "some" or "a lot" in the following areas:
- having authentic experiences in genetics
- achieving a better understanding of the relevance of genetics to the student personally, or to humanity
- improving quantitative reasoning skills (graphic, predicting, estimating, analyzing data sets)
- improving information literacy skills (knowing where to find information relevant to genetics, how to evaluate its legitimacy)
- working collaboratively in groups
Using evidence-based teaching practices
Two-thirds of students reported that focusing in-class time on exercises and question-and-answer sessions contributed to their success in the course.
When asked about how study habits changed during the term, student responses included (emphases mine):
- For other courses, I usually rely on the textbook and homework problems to study concepts. In this class, I was more interested in studying the relevence and application of concepts, which prompted me to rely more on outside resources like the internet. I focused more on interactive, application problems than just memorizing facts.
- I started to go through the audio recordings over and over again. Also stopped taking notes and started paying attention.
- I really liked how we had to read the chapter we were going to discuss in class, before class. I would love to do this for every class, but i dont have the incentive too and am too busy. But in this class, it was part of the hw grade, so i did it knowing it was not a waste of time but counted towards my grade.
Other tidbits
- 65% of students typed notes using their tablet devices
- 91% of students felt like they used their tablets in class as much as they had expected
- Importantly, almost 90% of students self-reported spending the majority of their in-class time using their tablet for on-task activities.
The not-so-good results (or, "Things to keep in mind when deciding how/whether to use mobile technology in the classroom")
Over a third of students indicated that they asked questions in class either less or much less than normal.
When asked about whether students felt that using tablets reduced the quality of instruction, some of the more negative responses included:
- Difficulty concentrating due to tablet and constant shift in attention among multiple apps
- It is very helpful, but some students can get easily distracted from it as well
- Sometimes the professors would have issues with technology, which in turn kind of look away time from the students.
- I prefer taking notes on paper and by writing it helps me more to remember
- I feel that I spent a lot more time figuring out how to use apps and the tablet to submit assignments than actually learning some of the material. Spent more time learning how to use the apps.
Looking to the future
Although a sizeable portion (about 15%) of students disagreed that a tablet-based class was better than a traditional lecture class, the vast majority of students (including many in that 15%) said that they would suggest to friends that they enroll in a tablet course.
Beyond this course, academic IT approaches are spreading on campus, with 84% of students in my class indicating that they used their tablet for coursework in other courses.
Conclusion
In sum, I'm buoyed by the successes I've had in redesign, but there are clearly places where my course redesign with technology needs tweaking, including:
- more consciously considering the degree to which I demand that students use technology (e.g. as opposed to paper)
- the number of ways (and therefore number of different apps) we use the tablets each day
Finally, as always, it is even more clear that managing student expectations and articulating instructor goals, motivations, and expectations is critical to ensure the best possible experience for students entering the unfamiliar and potentially stressful environment of a technology-redesigned course.