Saturday, February 17, 2018

Preaching to the choir

Last week, I gave an invited presentation at my local Mac User Group meeting (Fresno Mac Users Group, FMUG).
[Photo: presenting at the FMUG meeting]
The MUG learned about me through one of my former colleagues, who had seen me give an untethered presentation (about the biology research my undergrad and grad students and I conduct at California State University, Fresno) using my iPad, my iPhone, and ExplainEverything. You can view that presentation here, if you're interested in learning more about mutations, in a talk geared to a lay audience:
I hadn't known we had a MUG in central California! I was elated to be asked, and it was great to be back in the company of local die-hard Apple evangelists. When I was in high school in Oregon, I took regular advantage of that local MUG (in Corvallis), and I vividly recall borrowing my mom's car to drive there monthly to rub elbows with the other Apple geeks. My strongest memory of the CMUG meetings is the one where I purchased (and had installed) 4x 1 MB RAM chips to replace the 4x 256k RAM chips in the Mac Plus I had inherited from my parents when they upgraded to a 9500.
Yes, my family are famously slow adopters of new tech and/or advocates of the "drive it until it dies" approach (cars in my family tend to be on the road 20+ years before they get replaced). I'm the same way, which sometimes makes me a less capable advocate of technology because I'm often behind the curve on having the latest and greatest, but that's another story.
I had three main goals for my presentation to FMUG (an overview video, produced in Clips, is at:
Goal 1. To demonstrate how Apple builds accessibility features into its hardware and software. As an example, I used these screenshots, which show how inaccessible (to us red-green color-blind folk) color pickers used to be, as recently as System 7:
The introduction of the crayon color picker may have seemed childish, but the fact that each color was accompanied by a name was a huge deal to those of us who rely on that sort of information to color grass green, not brown or red!
[Image: the first crayon color picker, from Mac OS 8]
Here, just for fun, is a current version of the Apple color picker (left) and a color-blind simulated version (right). To me, the two images look identical. To you, if you aren't color-blind, this gives an idea of how I view the world and why color accessibility is useful.
[Image: normal-vision color picker (left) and color-blind simulation (right)]
For more information, please refer to this resource for choosing color-blind friendly colors to use in media you produce!
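If you're curious what a color-blind simulation like the one above does under the hood, here is a minimal sketch (my addition, not from the original post): it applies the full-severity deuteranopia (red-green) matrix published by Machado et al. (2009) to a linear-RGB color. The function name and the clamping details are illustrative choices of my own.

```python
# Sketch: simulate deuteranopia (a common red-green color blindness) for an
# RGB triple. The 3x3 matrix is the full-severity deuteranopia matrix from
# Machado et al. (2009) and operates on linear RGB values in [0, 1].

DEUTERANOPIA = [
    [0.367322, 0.860646, -0.227968],
    [0.280085, 0.672501,  0.047413],
    [-0.011820, 0.042940, 0.968881],
]

def simulate_deuteranopia(rgb):
    """Apply the simulation matrix to a linear-RGB triple, clamping to [0, 1]."""
    return tuple(
        min(1.0, max(0.0, sum(m * c for m, c in zip(row, rgb))))
        for row in DEUTERANOPIA
    )

# Grays are essentially unaffected (each matrix row sums to ~1), while pure
# red and pure green both collapse toward similar yellowish tones.
print(simulate_deuteranopia((0.5, 0.5, 0.5)))  # ~ (0.5, 0.5, 0.5)
print(simulate_deuteranopia((1.0, 0.0, 0.0)))
print(simulate_deuteranopia((0.0, 1.0, 0.0)))
```

Running this on a palette you plan to use is a quick way to check whether two colors that look distinct to you will still look distinct to a color-blind viewer.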
Goal 2. To broadcast information about Fresno State's "DISCOVERe" program, launched in 2014, which is a 1:1 tablet program (but not exclusively Apple, and probably inevitably moving to a BYOD approach). Participation is entirely voluntary: faculty join and then design at least one course that uses mobile devices to increase student engagement and success. Over the past three years, we have quadrupled both the number of DISCOVERe courses and the number of students enrolled in them; we now have almost 200 faculty fellows, while our initial cohort (of which I was a member) numbered 32. Most importantly, I wanted to share with FMUG (and you, too!) that in a single semester last year, DISCOVERe faculty who chose to adopt open educational resources, like iBooks, saved our students a total of over $117,000 in course materials costs!
Goal 3. To share my love of Clips. The highlight of the presentation was a live demonstration of Clips. I first showed a few examples of Clips I had produced, after which questions from the audience immediately turned to the captioning (which I pointed out was a critical component of accessibility).
Not surprisingly (to those with Clips experience), the biggest audible *gasp* in the room came when I gave a live demonstration of Live Titling. Many audience members had previously asked how I created the captions, but I held off on answering those questions until I got to this point!
After I was done demonstrating Clips, I was surprised to hear from the audience that most of them had not heard about Clips and wondered, after I showed them the basics of using this app, why Apple hadn't made a bigger push to promote Clips! I told them that this was one reason I was here, and that I would continue to be a resource for them in their adoption of Clips.
In every presentation I give, I want to make sure the audience leaves with information or a skill of immediate value. So, tonight, I asked all of them to brainstorm how they might use Clips in their lives. We had a great conversation about using the app to (for example) send multimedia greetings to family members via social media, and one kindergarten teacher asked me if it would make sense for her to record herself reading picture books for her students.
I remarked that Clips would be reasonable for short books like picture books, but I also emphasized a key benefit of Live Titling in videos: this accessibility feature is not just for the deaf. Non-native English speakers (and those, like children, learning to read and write English) really benefit from hearing a word and seeing it written at the same time!
By the end of the meeting, I had demonstrated another way to use Clips: as an aid to newcomers (to any event or organization). Clips, especially in conjunction with time-lapse video shot in Camera, is great for showing people how to navigate from one place to another, as I did here for students at the start of the semester to show them how to get from our lecture classroom to my office on campus:
This could work just as well for directions to a church, to your house for birthday party invitees, and so on. Here, I used Clips to provide FMUG a resource to share with new members to help them find the FMUG meeting space!
I was honored to share the power of Clips with my new Apple allies in Fresno and to build new connections between my campus and community. Again, you can view my FMUG Clip here:

Student Attitudes about Classroom Clips Videos

Last (Fall) semester, I used some newly developed skills related to Apple's Clips app to augment my upper-division genetics class for biology majors with brief, instructor-created videos for students to use in various scenarios. I've blogged about these before:
• Introducing yourself; your classroom
• Class "trailer" videos to on-board students to the topic of each class period
• Jigsaw exercises and remote instruction
• Describing and annotating processes in locations to which it is difficult to bring your entire class
Separately, I created a digital course manual (which, sadly, is not in iBook format yet) where I curated these Clips and other movies and resources I created, as well as open educational resources. I distributed this resource, called "GATC" (for "Genetics Assets and Tools Companion"), to my class at the start of the term.
  My intro Clip for students on using the GATC:
I'm happy to report that I just received the written comments from my student ratings of instruction; here are excerpts related to the videos and digital content I was able to produce using Clips and other Apple (and related) software and hardware: iPad, MacBook Air, QuickTime, and ExplainEverything (for lecture-capture video recording).
I'm sharing these to suggest that the blog posts above may contain some valuable practices for using short videos in the context of a student's classroom experience. For full disclosure, I should also mention that, as part of my course augmentation with technology, I also employed pre-class video lectures that students watched before coming to class; this was not accomplished with Apple hardware or software (see an example of one of my Lightboard, a.k.a. Learning Glass, videos at:

Student comments

"I love the GATC Course manual because he highlights the exact topics that we should focus on and he makes lots of information easily available to his students."
"I appreciate the recorded lectures that are posted on YouTube. Sometimes I don't get stuff down in time and it's nice to be able to go back and listen to Dr. Ross' explanations again."
"The class discussions made it easier to learn the information. The before class videos were also very helpful."
"The way in which Dr. Ross incorporates technology allows for his students to better understand the material at hand."
In all, I think a critical point here, worth disseminating as broadly as possible, is that educators can make positive impacts on students using mobile technology even if the students don't all have a device. Apple technology in the hands of only the instructor can still help engage and inspire students!

Thursday, November 30, 2017

Clips in live presentations about using Clips in and out of the classroom


I've written before about using Apple's Clips app to create a variety of types of videos (including trailers and jigsaw microlectures) to drive student interaction and to help them understand the importance (or relevance) of material. Some relevant posts are:

Today, I gave a presentation about the use of videos in (and out of) class to engage students. My audience was the Directors of Educational Technology in California Higher Education (DET/CHE) 2017 conference. I took advantage of this opportunity to demonstrate a new (to me, at least) use of Clips in live presentations.

Why Clips?


Although I wanted lots of platform time to give my presentation, I was only afforded a ten-minute lightning talk (but thanks to DET/CHE for accepting my application and offering even that much time!). Now, I assume I'm not alone in feeling a bit of stress and nervousness before giving any sort of presentation, especially to this many people with such a wide range of backgrounds (a relative handful of faculty; mostly administrators, technical staff, and instructional designers). However, my nervousness is not about speaking in front of large groups (which I actually really enjoy…) - the only real source of concern is keeping to my allotted time! I have a tendency to let my remarks run long…
Panorama of the audience in the ballroom

Another challenge (which I didn't know about beforehand, but should have learned to expect at conferences by now) is that the ballroom projects to two screens. I always find this awkward for presenting, because I tend to use a LASER pointer to point out specific items on my slides. In the "dual screen" conundrum, however, one has to favor one screen (and thus point for only half of the audience). Talk about the "digital divide!"

Panorama of the audience view of the front of the ballroom

Why Clips?

Yes, and…

My feeling was that Clips could help me address both obstacles.
  • I would create my presentation media in Clips, by recording all of my pre-prepared slides into a single Clip, and then export it as a movie file that I would play as a projected video during my live presentation. Because of the ability to control the lengths of individual component videos of a Clip, I would have control over how much time I could spend on each topic (slide) I wanted to cover, and thus be able to fine-tune the total length of my presentation to fit within my ten minutes.
  • As my presentation was on the use of videos in the classroom, I also wanted to demonstrate the abilities of Clips itself. So, not only did building the original video in Clips make sense, but it also allowed me to demonstrate Clips features that are difficult to work into one seamless presentation, like adding posters and stickers. By pre-recording my Clip as my projected presentation material, I was able to add in those elements. Critically, this helps with the "dual screen" conundrum, because all of the graphic elements added in Clips (to provide contextual information and to highlight specific parts of slides) are projected along with the video itself - so both sides of the room get the same animated presentation.
  • Another benefit is that, if you are nervous and shaky with a LASER pointer when you give presentations, this approach avoids the need to use such a pointer, even if you are only projecting to a single screen. Also, if you are presenting on flatscreen TVs (for example), on which LASERs don't show, this is the perfect solution: do all of your "pointing" within Clips!
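The first point above, fitting pre-recorded segments into a fixed slot, is easy to plan with a little arithmetic. Here is a minimal sketch (my own, with illustrative names and numbers): assign each slide a relative weight and split the total time, ten minutes for a lightning talk, in proportion.

```python
# Sketch: budget per-slide recording times so the exported Clip fits a fixed
# time slot (e.g., a ten-minute lightning talk). Weights express the relative
# emphasis each slide deserves; all values here are illustrative.

def budget_seconds(weights, total_seconds=600):
    """Split total_seconds across slides in proportion to their weights."""
    total_weight = sum(weights)
    return [total_seconds * w / total_weight for w in weights]

# Five slides: the last one (the live demo) gets the most time.
slides = budget_seconds([1, 2, 2, 1, 4], total_seconds=600)
print([round(s) for s in slides])  # [60, 120, 120, 60, 240]
```

When you then record each slide in Clips for its budgeted duration, the exported movie is guaranteed to fit your allotted time.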


Before the Presentation

  1. Create slides in PPT (static graphics); export each slide as its own image
  2. Locate any existing video material (like Clips!) that you want to incorporate into your presentation
  3. Move those graphics into your Photos library to access from Clips
  4. Record (and voice over) each slide image and movie within Clips - this ensures that each segment runs the right length of time for the live remarks you will deliver when you replay the Clip to your audience
  5. Export the completed Clip as a video to your Photos library
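One practical wrinkle when preparing slide images for step 3: Clips records square video (a drawback discussed later in this post), so widescreen slide exports get cropped unless you letterbox them first. Here is a small sketch (my own; the function name is illustrative) of the geometry, which you could pair with an image library such as Pillow (`Image.new` / `Image.paste`) to do the actual compositing:

```python
# Sketch: compute where to paste a rectangular slide image on a square canvas,
# since Clips video is square. Pure geometry; pair with an image library
# (e.g., Pillow's Image.new and Image.paste) for the actual compositing.

def square_canvas_box(width, height):
    """Return (side, left, top): the side length of a square canvas and the
    offsets that center a width x height slide on it."""
    side = max(width, height)
    return side, (side - width) // 2, (side - height) // 2

# A 16:9 slide (1920x1080) needs a 1920x1920 canvas, pasted 420 px from the top.
print(square_canvas_box(1920, 1080))  # (1920, 0, 420)
```

Letterboxing this way keeps every slide fully visible inside the square Clips frame instead of losing the edges.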

During the Presentation

  1. Connect your iOS device to projection
  2. Launch Clips
  3. Access the exported video of your Clip in your Photos Library
  4. Make sure live captioning is enabled but doesn't display in real time as you speak (I've found that mis-transcriptions during a live presentation can be distracting…)
  5. Start recording (using the recording lock is a good idea, so you don't have to hold the "Record" button with your finger during your entire presentation!)
  6. Speak as the movie imports into Clips - deliver your remarks as the movie runs


When you export your Clip as a movie, make sure that your live titles are hidden, so that they don't appear on the video while you're speaking. That way, when you say something different than you did when you originally recorded the clip, it isn't obvious to the audience!

Drawbacks to Clips for live presentation

You must download your Clip movie to your Photos Library in advance - do a dry run of using Clips for presentation before you go live - that way you won't have to wait for the video to download and load before you can begin!
If you think you won't have internet access on your iOS device during a live presentation, then you can't live caption. In that case, one workaround is to use a different device to record your audio while you present, and then play that audio track into the microphone of the iOS device running Clips to "live caption" the presentation later.
The Clips app does not have a "presentation view" (nor should it, as this isn't its function), so in a live presentation/recording the audience sees your entire Clips screen. A related issue is that the square format of a Clips video might project too small for audience viewing, depending on the set-up of your presentation space. This format issue is currently compounded by relatively little user control over font size in Clips-generated banners, stickers, and the like.

Alternative solutions that don't involve Clips:

  • Keynote and PowerPoint can do timed slides and embedded video
  • Instead of using Clips for the actual presentation, one can simply play the exported movie in full-screen mode (this is a "cleaner" look, as it doesn't involve projecting the Clips user interface along with your content).

Is using Clips efficient for live presentation?

After I created my slides (in PowerPoint), it didn't take me long (maybe an hour) to create the ten-minute clip, and perhaps another half-hour to edit the live captions - mostly adding punctuation, which isn't inserted automatically.

A side benefit of preparing the presentation in advance is that you can export the live-titled (captioned) video to your favorite social media site to share with the world before you even give your presentation! I even opted to share the YouTube URL for the presentation at the end of my talk!

As a final aside, I also used Clips to create trailers to promote attending my presentation:

Since my presentation this morning, I've been receiving feedback from colleagues about the apparent power of Clips and the multiple ways I use videos (especially those made with Clips) in higher ed to engage students. With the help of Clips, my lightning talk turned out to be enlightening to my peers!